Library Integration for IMU
Table of Contents
This section will cover constructing a video stabilizer that uses IMU data for movement estimation.
- Preparing the Video: shows how to construct the Image objects with the proper format and timestamps. Each video frame must have a supported format and a timestamp expressed in microseconds, using the same time reference as the IMU measurements.
- Preparing the IMU Measurements: shows how to construct the SensorPayload objects with the proper format and timestamps. Each IMU measurement must have a timestamp expressed in microseconds, using the same time reference as the video frames. The angular velocities must be given in radians per second.
- Measurement Integration: the vector of measurements (SensorPayload) must be integrated and converted into smoothed rotation quaternions using one of the Integration algorithms.
- Interpolation: the integrated quaternions must be interpolated and aligned to each frame's timestamp, so that the rotation matches the actual camera orientation at the moment the image was captured. If an image lacks a timestamp, the timestamps must be aligned before this step.
- Computing the Stabilization: once the rotation for the image of interest is known, the stabilizer estimates the distortion the image has suffered given the movement history. A stabilization algorithm is applied to determine this distortion.
- Video Undistortion: with the distortion computed, the image can be undistorted using any of the available execution backends.
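To make the integration step concrete, the sketch below shows how gyroscope samples (microsecond timestamps, rates in rad/s) can be integrated into orientation quaternions. The actual SensorPayload-based API is library-specific; this is a minimal plain-Python illustration of the underlying math, with samples represented as hypothetical `(timestamp_us, (wx, wy, wz))` tuples.

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(samples):
    """Integrate gyro samples into orientations.

    samples: list of (timestamp_us, (wx, wy, wz)), timestamps in
    microseconds on the same clock as the video, rates in rad/s.
    Returns a list of (timestamp_us, quaternion), starting at identity.
    """
    quats = [(samples[0][0], (1.0, 0.0, 0.0, 0.0))]
    for (t0, w), (t1, _) in zip(samples, samples[1:]):
        dt = (t1 - t0) * 1e-6  # microseconds -> seconds
        mag = math.sqrt(w[0]**2 + w[1]**2 + w[2]**2)
        if mag < 1e-12:
            dq = (1.0, 0.0, 0.0, 0.0)  # no measurable rotation
        else:
            half = mag * dt / 2.0      # half the rotation angle over dt
            s = math.sin(half) / mag
            dq = (math.cos(half), w[0]*s, w[1]*s, w[2]*s)
        quats.append((t1, quat_mul(quats[-1][1], dq)))
    return quats
```

A constant rate of pi/2 rad/s about the z-axis for one second, for example, yields a final quaternion equivalent to a 90-degree rotation about z.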
We also provide an Example Application detailing this process.
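The interpolation step described above can likewise be sketched in plain Python: given the integrated (timestamp, quaternion) pairs, spherical linear interpolation (slerp) recovers the camera orientation at an arbitrary frame timestamp. The function names here are illustrative, not part of the library's API.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions for t in [0, 1]."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                      # take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:                   # nearly parallel: lerp + renormalize
        q = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in q))
        return tuple(c / n for c in q)
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def orientation_at(quats, frame_ts_us):
    """Interpolate the orientation at a frame timestamp (microseconds).

    quats: list of (timestamp_us, quaternion) sorted by time, e.g. the
    output of the integration step.
    """
    for (t0, q0), (t1, q1) in zip(quats, quats[1:]):
        if t0 <= frame_ts_us <= t1:
            return slerp(q0, q1, (frame_ts_us - t0) / (t1 - t0))
    raise ValueError("frame timestamp outside the integrated IMU range")
```

Interpolating halfway between the identity and a 90-degree rotation about z, for instance, gives the 45-degree rotation about z.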